A Viterbi-like algorithm and EM learning for statistical abduction
Authors
Abstract
We propose statistical abduction as a first-order logical framework for representing and learning probabilistic knowledge. It combines logical abduction with a parameterized distribution over abducibles. We show that probability computation, a Viterbi-like algorithm and EM learning for statistical abduction achieve the same efficiency as specialized algorithms for HMMs (hidden Markov models), PCFGs (probabilistic context-free grammars) and sc-BNs (singly connected Bayesian networks).
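As an illustration of the kind of Viterbi-like computation the abstract refers to, the sketch below finds a most probable explanation by dynamic programming over a tabled explanation (AND-OR) graph. It is only a sketch under assumed data structures: `graph`, `theta` and the goal/abducible encoding are hypothetical and not taken from the paper, and the tabled search is assumed to yield an acyclic graph.

```python
# A minimal sketch (not the paper's exact algorithm): most probable
# explanation by dynamic programming over a tabled explanation (AND-OR) graph.
# Hypothetical data structures:
#   graph[goal] -> list of explanations; each explanation is a list of items,
#                  where an item is ("abducible", name) or ("goal", subgoal)
#   theta[name] -> probability parameter of an abducible
def viterbi_abduction(graph, theta, top_goal):
    best = {}      # goal -> probability of its best explanation
    choice = {}    # goal -> the explanation achieving that probability

    def solve(goal):
        if goal in best:
            return best[goal]
        best_p, best_e = 0.0, None
        for expl in graph[goal]:                  # OR: choose one explanation
            p = 1.0
            for kind, x in expl:                  # AND: multiply its factors
                p *= theta[x] if kind == "abducible" else solve(x)
            if p > best_p:
                best_p, best_e = p, expl
        best[goal], choice[goal] = best_p, best_e
        return best_p

    solve(top_goal)
    return best[top_goal], choice
```

With an HMM encoded as such a graph (abducibles for state transitions and emissions), this recursion behaves like the textbook Viterbi algorithm, which is the kind of efficiency comparison the abstract makes.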
Similar resources
Statistical abduction with tabulation
We propose statistical abduction as a first-order logical framework for representing, inferring and learning probabilistic knowledge. It semantically integrates logical abduction with a parameterized distribution over abducibles. We show that statistical abduction combined with tabulated search provides an efficient algorithm for probability computation, a Viterbi-like algorithm for finding the most ...
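The probability computation mentioned here can be pictured as the sum-product analogue of the max-product sketch above; the graph encoding is again an illustrative assumption, not the paper's own representation.

```python
# A minimal sketch, assuming the same illustrative AND-OR graph as above:
# the probability of a goal is the sum over its explanations of the product
# of abducible parameters and subgoal probabilities (sum-product instead of
# the max-product used in the Viterbi-like computation).
def goal_probability(graph, theta, goal, cache=None):
    cache = {} if cache is None else cache
    if goal in cache:
        return cache[goal]
    total = 0.0
    for expl in graph[goal]:                      # OR: sum over explanations
        p = 1.0
        for kind, x in expl:                      # AND: product of factors
            p *= theta[x] if kind == "abducible" else goal_probability(graph, theta, x, cache)
        total += p
    cache[goal] = total
    return total
```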
Terminology of Combining the Sentences of Farsi Language with the Viterbi Algorithm and BI-GRAM Labeling
This paper, based on the Viterbi algorithm, selects the most likely combination of different wordings from a variety of candidates. To this end, it uses the bigram and unigram tags of each word, based on the letters forming the words, as well as the bigram and unigram labels after the breakdown into the composition or moment of transition from the decomposition to the combination obtained from th...
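A generic version of the lattice-style Viterbi search this abstract describes might look like the following; the candidate lattice and the bigram scoring (with a small smoothing floor) are illustrative assumptions, and the unigram component is omitted for brevity.

```python
# A minimal, generic sketch: each position offers several candidate words,
# and a hypothetical bigram score ranks the paths.  Not the paper's model.
def viterbi_lattice(candidates, bigram, floor=1e-6):
    # candidates: list of lists of candidate words, one list per position
    # bigram[(w1, w2)]: probability of w2 following w1
    prev = {w: (1.0, [w]) for w in candidates[0]}
    for position in candidates[1:]:
        cur = {}
        for w in position:
            best_p, best_path = 0.0, None
            for v, (p, path) in prev.items():
                q = p * bigram.get((v, w), floor)
                if q > best_p:
                    best_p, best_path = q, path + [w]
            cur[w] = (best_p, best_path)
        prev = cur
    return max(prev.values(), key=lambda t: t[0])   # (score, best word sequence)
```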
Generalized Baum-Welch and Viterbi Algorithms Based on the Direct Dependency among Observations
The parameters of a Hidden Markov Model (HMM) are transition and emission probabilities. Both can be estimated using the Baum-Welch algorithm. The process of discovering the sequence of hidden states, given the sequence of observations, is performed by the Viterbi algorithm. In both Baum-Welch and Viterbi algorithms, it is assumed that...
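For concreteness, the sketch below shows the forward pass that the standard Baum-Welch E-step builds on, for a discrete HMM; the array names and shapes are illustrative and not taken from this paper.

```python
import numpy as np

# A minimal sketch of the HMM forward pass: transition matrix A (N x N),
# emission matrix B (N x M), initial distribution pi (N), and an integer
# observation sequence.  All names here are illustrative assumptions.
def forward(A, B, pi, observations):
    N = A.shape[0]
    T = len(observations)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, observations[0]]              # initialize with first symbol
    for t in range(1, T):
        # alpha[t, j] = sum_i alpha[t-1, i] * A[i, j] * B[j, o_t]
        alpha[t] = (alpha[t - 1] @ A) * B[:, observations[t]]
    return alpha, alpha[-1].sum()                      # forward table and P(observations)
```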
Learning Stochastic Bracketing Inversion Transduction Grammars with a Cubic Time Biparsing Algorithm
We present a biparsing algorithm for Stochastic Bracketing Inversion Transduction Grammars that runs in O(bn³) time instead of O(n⁶). Transduction grammars learned via an EM estimation procedure based on this biparsing algorithm are evaluated directly on the translation task, by building a phrase-based statistical MT system on top of the alignments dictated by Viterbi parses under the induced b...
Generalizing a Strongly Lexicalized Parser using Unlabeled Data
Statistical parsers trained on labeled data suffer from sparsity, both grammatical and lexical. For parsers based on strongly lexicalized grammar formalisms (such as CCG, which has complex lexical categories but simple combinatory rules), the problem of sparsity can be isolated to the lexicon. In this paper, we show that semi-supervised Viterbi-EM can be used to extend the lexicon of a generati...